Search results for "Entropy function"
Showing 8 of 8 documents
Combinatorics of the SU(2) black hole entropy in loop quantum gravity
2009
We use the combinatorial and number-theoretical methods developed in previous works by the authors to study black hole entropy in the new proposal put forth by Engle, Noui, and Perez. Specifically, we give the generating functions relevant for the computation of the entropy and use them to derive its asymptotic behavior, including the value of the Immirzi parameter and the coefficient of the logarithmic correction.
Probabilities, States, Statistics
2016
In this chapter we clarify some important notions relevant to a statistical theory of heat: the definitions of probability measure and of thermodynamic state are illustrated, successively, by classical Maxwell-Boltzmann statistics, by Fermi-Dirac statistics, and by Bose-Einstein statistics. We discuss observables and their eigenvalue spectra as well as entropy, and we calculate these quantities for some examples. The chapter closes with a comparison of statistical descriptions of classical and quantum gases.
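As a quick numerical companion to the chapter's comparison of the three statistics (a minimal sketch, not taken from the chapter itself; energies measured relative to a chemical potential `mu` in units of `kT`), the mean occupation numbers can be computed as:

```python
import math

def occupation(eps, mu, kT, stat):
    """Mean occupation number of a single-particle level of energy eps.

    stat: "MB" (Maxwell-Boltzmann), "FD" (Fermi-Dirac), "BE" (Bose-Einstein).
    """
    x = (eps - mu) / kT
    if stat == "MB":
        return math.exp(-x)               # classical Boltzmann factor
    if stat == "FD":
        return 1.0 / (math.exp(x) + 1.0)  # Pauli exclusion caps occupation at 1
    if stat == "BE":
        return 1.0 / (math.exp(x) - 1.0)  # requires eps > mu
    raise ValueError(f"unknown statistics: {stat}")

# In the dilute limit (eps - mu >> kT) the quantum statistics approach
# the classical one, from below (FD) and from above (BE):
for stat in ("FD", "MB", "BE"):
    print(stat, occupation(10.0, 0.0, 1.0, stat))
```

The printed values illustrate the classical limit discussed in the chapter's closing comparison of classical and quantum gases.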
Entropy function from toric geometry
2021
It has recently been claimed that a Cardy-like limit of the superconformal index of 4d $\mathcal{N}=4$ SYM accounts for the entropy function, whose Legendre transform corresponds to the entropy of the holographic dual AdS$_5$ rotating black hole. Here we study this Cardy-like limit for $\mathcal{N}=1$ toric quiver gauge theories, observing that the corresponding entropy function can be interpreted in terms of the toric data. Furthermore, for some families of models, we compute the Legendre transform of the entropy function, comparing with similar results recently discussed in the literature.
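For orientation, in the $\mathcal{N}=4$ SYM case the entropy function referred to above takes the schematic form (a sketch; normalizations and the sign in the constraint vary across the literature):

$$
\mathcal{E}(\Delta_a,\omega_i) \;=\; -\,i\pi N^2\,\frac{\Delta_1\Delta_2\Delta_3}{\omega_1\omega_2},
\qquad
\Delta_1+\Delta_2+\Delta_3-\omega_1-\omega_2 \;=\; \mp 1,
$$

and the black hole entropy $S(Q_a, J_i)$ is obtained as the constrained Legendre transform of $\mathcal{E}$ with respect to the chemical potentials $\Delta_a$, $\omega_i$ conjugate to the R-charges $Q_a$ and angular momenta $J_i$.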
Fractional-order theory of thermoelasticity. I: Generalization of the Fourier equation
2018
The paper deals with the generalization of Fourier-type relations in the context of fractional-order calculus. The instantaneous temperature-flux equation of Fourier-type diffusion is generalized by introducing a self-similar, fractal-type mass clustering at the micro scale. In this setting, the resulting conduction equation at the macro scale involves a Caputo fractional derivative, of order in [0,1], of the temperature gradient, which generalizes the Fourier conduction equation. The order of the fractional derivative has been related to the fractal assembly of the microstructure, and some preliminary observations about the thermodynamical restrictions of the coefficients and the state functions r…
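For reference, the Caputo fractional derivative of order $\beta\in(0,1)$ entering such generalizations is (standard definition; the paper's exact constitutive law is truncated above):

$$
{}^{C}\!D_t^{\beta} f(t) \;=\; \frac{1}{\Gamma(1-\beta)} \int_0^t \frac{f'(\tau)}{(t-\tau)^{\beta}}\,\mathrm{d}\tau ,
$$

so that a fractional Fourier-type law may be sketched as $q = -\kappa_\beta\, {}^{C}\!D_t^{\beta}\,\nabla T$, with $\kappa_\beta$ an anomalous conductivity (notation assumed here, not taken from the paper). For $\beta \to 1$ the Caputo derivative reduces to the ordinary time derivative and the classical Fourier law is recovered in rate form.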
An algorithmic construction of entropies in higher-order nonlinear PDEs
2006
A new approach to the construction of entropies and entropy productions for a large class of nonlinear evolutionary PDEs of even order in one space dimension is presented. The task of proving entropy dissipation is reformulated as a decision problem for polynomial systems. The method is successfully applied to the porous medium equation, the thin film equation and the quantum drift–diffusion model. In all cases, an infinite number of entropy functionals together with the associated entropy productions is derived. Our technique can be extended to higher-order entropies, containing derivatives of the solution, and to several space dimensions. Furthermore, logarithmic Sobolev inequalities can …
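A typical zeroth-order family of entropy functionals arising in this context (a sketch of the standard form; as noted, the method also produces higher-order entropies containing derivatives of $u$) is

$$
E_\alpha[u] \;=\; \frac{1}{\alpha(\alpha-1)} \int_\Omega u^\alpha \,\mathrm{d}x \quad (\alpha \neq 0,1),
\qquad
E_1[u] \;=\; \int_\Omega u\,(\log u - 1)\,\mathrm{d}x ,
$$

where entropy dissipation means $\frac{\mathrm{d}}{\mathrm{d}t} E_\alpha[u(t)] \le 0$ along solutions of the evolution equation; deciding for which $\alpha$ this holds is the polynomial decision problem mentioned above.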
On the Evaluation of Images Complexity: A Fuzzy Approach
2006
The inherently multidimensional problem of evaluating the complexity of an image is of a certain relevance in both computer science and cognitive psychology. Computer scientists usually analyze spatial dimensions to deal with automatic vision problems, such as feature extraction. Psychologists seem more interested in the temporal dimension of complexity, to explore attentional models. Is it possible, by merging both approaches, to define a more general index of visual complexity? We have defined a fuzzy mathematical model of visual complexity, using a specific entropy function; results obtained by applying this model to pictorial images have a strong correlation with those from an experime…
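Not the authors' fuzzy model, but a minimal sketch of the underlying idea of an entropy-based complexity index, assuming the image is given as a flat list of grayscale intensities:

```python
import math
from collections import Counter

def histogram_entropy(pixels):
    """Shannon entropy (in bits) of the intensity histogram of an image.

    A crude spatial-complexity index: zero for a constant image,
    log2(k) when k intensity levels occur equally often.
    """
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

flat = [128] * 100          # constant image: zero complexity
varied = list(range(100))   # 100 equally frequent levels: log2(100) bits
print(histogram_entropy(flat), histogram_entropy(varied))
```

The paper's fuzzy model goes beyond raw frequencies; this only illustrates the entropy ingredient such an index is built on.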
Extropy: Complementary Dual of Entropy
2015
This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments…
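The duality described in the abstract is easy to check numerically (a minimal sketch using natural logarithms; the definition $J(p) = -\sum_i (1-p_i)\log(1-p_i)$ follows the article):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    """Extropy J(p) = -sum_i (1 - p_i) log(1 - p_i), the complementary dual."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

binary = [0.3, 0.7]
print(entropy(binary), extropy(binary))    # equal: the measures coincide for binary distributions

ternary = [0.2, 0.3, 0.5]
print(entropy(ternary), extropy(ternary))  # different: the measures bifurcate beyond two outcomes
```

For a binary distribution $(p, 1-p)$ the two sums contain the same terms, which is why the measures coincide there and only bifurcate for three or more outcomes.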
Estimating the decomposition of predictive information in multivariate systems
2015
In the study of complex systems from observed multivariate time series, the evolution of one system is often under investigation; this evolution can be explained by the information storage of the system and the information transfer from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer, computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of co…
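The decomposition underlying this framework can be written (notation assumed here) as the chain rule of mutual information:

$$
P_Y \;=\; I(Y_n;\,Y_n^-, X_n^-) \;=\; \underbrace{I(Y_n;\,Y_n^-)}_{S_Y\ \text{(information storage)}} \;+\; \underbrace{I(Y_n;\,X_n^- \mid Y_n^-)}_{T_{X\to Y}\ \text{(information transfer)}},
$$

where $Y_n$ is the present of the target process and $Y_n^-$, $X_n^-$ are the past histories of the target and of the other interacting processes; the second term is the transfer entropy from $X$ to $Y$.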